01 Aug 2013 · Jim Killock
Diane Abbott responds on web forum blocking
On a cycling forum whose members are rightly worried that their forum may be blocked by default filters, Skydancer posted a response he was given by Diane Abbott:
I do not believe that the arrangements to protect children from hard core porn online will affect a forum to discuss cycling! I think that men, who think that viewing hard core porn without let or hindrance is some kind of human right, are deliberately exaggerating the effect of the suggested arrangements
I asked Diane Abbott about this, and to her credit she replied. The conversation was private, so I won’t quote her replies, but I think it is common knowledge that Diane believes that default filters are more effective. I think it is also common knowledge that she is prioritising child safety and wants everything possible done to block children’s access to pornography, which she regards as very seriously harmful. However, the statement above makes clear that she has made a factual mistake: forums and social media are very likely to be targets of default parental filters.
The question for Diane Abbott should be: given that categories like web forums are likely to be blocked by some of the filters, and that the boxes are very likely to be pre-selected, does she believe these sites should be blocked by default? Or does she not care?
But it isn’t just her policy. Claire Perry and David Cameron have taken credit for it, and the ISPs have agreed to it.
Let’s define the intent as:
To limit children’s access to pornography by maximising the number of households where pornography is blocked for children
and examine the “nudge censorship” policy against that.
Problem one: breadth for adults and children
The “pre-selected” categories may include very broad types of content, such as “alcohol, drugs and tobacco,” “social media” and “web forums”. The impact is broader than just “pornography”.
Let’s measure this purely against the objective above. Firstly, blocking these broader categories is irrelevant to that objective. Secondly, if web filters appear to be blocking too much content, there is surely a danger that some parents will simply switch them back off, not least because their children constantly ask for sites to be unblocked.
Problem two: collateral damage
Collateral damage in this case means exactly that: climbing and cycling clubs, pubs and bars being blocked. We can add outright mistakes, sites wrongly categorised by the filters, to this as well.
From a public policy point of view, collateral damage is something that any sensible policy should minimise. It is no good to say “your climbing club is worth less than my child”; a decently designed policy should aim to meet the needs of both.
Problem three: commercial sites want access to their market, adults want access to porn
This is a problem where the outcome is difficult to predict. If 30% of a publisher’s potential market is suddenly lost, as adults in households with children find porn blocked, then publishers may go to some effort to win it back. This might make filtering less effective; we don’t know. One can imagine apps becoming a means of distribution, for instance. Perhaps pornographic spam will become more popular, reaching children indiscriminately.
Equally, adults who find pornography blocked are placed in a difficult situation, where the outcome for children is again unpredictable. Perhaps irresponsible parents will simply switch the filters off, leaving vulnerable children even more vulnerable. Perhaps, if filters were targeted at children, and designed so that adults were not subject to them, this would be less of a worry.
What we are dealing with is software design, not social engineering
What I am getting at is that for government to insist on particular ways of setting up a software system, specifying that categories are “pre-selected”, that buttons must say “next”, that filtering must be network based, is simply to make a category error.
Government has an objective, but is negotiating the form of the user experience as if it had omniscient knowledge of user behaviour. It doesn’t – it has a few gut instincts. It needs to separate the objective of limiting children’s access to pornography from the means to deliver it, especially as we get into the details.
In government policy terms, it is extraordinary for Claire Perry and David Cameron to be staking their reputations on whether or not boxes are “pre-selected”, and whether a forced choice (“active choice”) is better than a kind of default option.
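To make that distinction concrete, here is a minimal sketch in TypeScript of the two designs being argued over. It is entirely hypothetical: the category names and the shape of the set-up screen are assumptions for illustration, not a description of any ISP’s actual system.

```typescript
// Hypothetical filter set-up screen state. The category names are
// illustrative, not taken from any real ISP's implementation.
type Category = "pornography" | "web forums" | "social media";

const ALL_CATEGORIES: Category[] = ["pornography", "web forums", "social media"];

// "Nudge" design: every box arrives ticked, and clicking "next" accepts
// whatever is ticked. Doing nothing means everything gets blocked.
const preSelected: Record<Category, boolean> = {
  "pornography": true,
  "web forums": true,
  "social media": true,
};

// "Active choice" design: there is no default. The household starts with
// no decision recorded for any category.
const activeChoice: Partial<Record<Category, boolean>> = {};

// Under active choice, "next" stays disabled until every category has an
// explicit yes or no; under the nudge design this check passes immediately.
function canProceed(choices: Partial<Record<Category, boolean>>): boolean {
  return ALL_CATEGORIES.every((c) => typeof choices[c] === "boolean");
}

console.log(canProceed(preSelected));  // true: the ticked boxes count as decisions
console.log(canProceed(activeChoice)); // false: nothing has been decided yet
```

Which of these two designs better serves the stated objective is an empirical question about how real households behave at that screen, not something government can settle by negotiation.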
Lawrence Lessig famously said that “Code is Law”, meaning that the operations of machines increasingly determine social outcomes, such as automatic content identification and DMCA takedowns.
Without really knowing why, politicians are stumbling on this concept, and becoming amateur software UX (user experience) designers. Unpredictable consequences will ensue. Please sign the petition!
Footnote: interestingly, Richard Thaler did his best yesterday to distance himself and the Number 10 unit from these proposals. I am not sure if he is correct (maybe this is ‘bastard-nudge’) but he clearly doesn’t want to be associated with “nudge censorship”.
Read more about the “Check if your website is being blocked by filters” campaign